Leveraging Large Language Models for Bengali Math Word Problem Solving

Advancing Multistep Reasoning with Chain-of-Thought and Efficient Fine-Tuning

Authors: B. Paul et al.
Published on arXiv: 2025-05-27
Link: http://arxiv.org/abs/2505.21354v1
Institutions: Ahsanullah University of Science and Technology, Tejgaon, Dhaka
Keywords: Natural Language Processing, Chain of Thought, Low-Resource Language, Large Language Models, LoRA, Math Word Problems, Bengali, Fine-tuning, Few-shot Learning, Prompt Engineering, Reasoning, Educational Technology

Solving Math Word Problems (MWPs) in Bengali poses unique challenges for natural language processing: Bengali is a low-resource language, and MWPs demand complex multistep reasoning. Existing Bengali resources often lack reasoning-focused annotations, limiting progress on AI models for mathematical problem solving. And while Chain-of-Thought (CoT) prompting has yielded clear improvements in high-resource languages, it has not been systematically explored for Bengali MWPs.
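
To make the prompting strategy concrete, here is a minimal sketch of few-shot Chain-of-Thought prompting on a Bengali MWP. The exemplar problem, its wording, and the model name are illustrative assumptions rather than the paper's exact prompts, and an OpenAI API key is assumed to be configured.

```python
# Minimal few-shot Chain-of-Thought prompting sketch for a Bengali MWP.
# The exemplar and model name are illustrative, not taken from the paper.
from openai import OpenAI

# One worked exemplar: a problem followed by explicit step-by-step reasoning.
# "Rahim has 5 mangoes. He buys 3 more. How many in total?" -> 5 + 3 = 8
COT_EXEMPLAR = (
    "প্রশ্ন: রহিমের কাছে ৫টি আম আছে। সে আরও ৩টি আম কিনল। তার মোট কতটি আম হলো?\n"
    "ধাপে ধাপে সমাধান: শুরুতে ৫টি আম ছিল। সে আরও ৩টি কিনল, তাই ৫ + ৩ = ৮।\n"
    "উত্তর: ৮"
)

def solve_with_cot(problem: str, model: str = "gpt-4o") -> str:
    """Prepend a CoT exemplar so the model reasons step by step in Bengali."""
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    prompt = f"{COT_EXEMPLAR}\n\nপ্রশ্ন: {problem}\nধাপে ধাপে সমাধান:"
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        temperature=0.0,  # deterministic decoding for evaluation
    )
    return response.choices[0].message.content

# Example query: "A basket holds 12 eggs. 4 broke. How many are left intact?"
# print(solve_with_cot("একটি ঝুড়িতে ১২টি ডিম আছে। ৪টি ভেঙে গেল। কতটি অক্ষত রইল?"))
```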

To address these gaps, the authors propose a comprehensive approach that pairs Chain-of-Thought prompting and few-shot prompt engineering with parameter-efficient LoRA fine-tuning for Bengali MWPs; a sketch of the fine-tuning setup follows below.
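
The efficient fine-tuning side can be sketched with Hugging Face PEFT. The base model, rank, scaling factor, and target modules below are illustrative assumptions, not the paper's reported configuration.

```python
# Minimal LoRA fine-tuning setup with Hugging Face PEFT; hyperparameters
# and the base model are illustrative assumptions, not the paper's values.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

BASE = "meta-llama/Llama-3.1-8B"  # assumed base model for illustration
tokenizer = AutoTokenizer.from_pretrained(BASE)
model = AutoModelForCausalLM.from_pretrained(BASE)

lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=16,                                 # rank of the low-rank adapter matrices
    lora_alpha=32,                        # scaling applied to the adapter output
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections to adapt
)

# Freeze the base weights and train only the small adapter matrices,
# which is what makes LoRA parameter-efficient.
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```

Because only the adapters are updated, the trainable parameter count drops to a small fraction of the full model, which is what makes fine-tuning large models feasible in low-resource settings.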

The authors then report experimental results comparing these prompting and fine-tuning strategies on Bengali MWPs; a typical answer-scoring setup for such evaluations is sketched below.
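
For scoring, MWP outputs are commonly judged by exact match on the extracted final answer. The sketch below assumes generations end with an "উত্তর:" ("Answer:") marker and may use Bengali digits; both are assumptions for illustration, not the paper's exact protocol.

```python
# Sketch of final-answer extraction and exact-match accuracy for Bengali
# MWP generations; the answer marker and digit handling are assumptions.
import re

BN_DIGITS = str.maketrans("০১২৩৪৫৬৭৮৯", "0123456789")  # Bengali -> ASCII digits

def extract_answer(generation: str) -> str | None:
    """Return the last number after the final 'উত্তর:' marker, if any."""
    tail = generation.rsplit("উত্তর:", 1)[-1].translate(BN_DIGITS)
    numbers = re.findall(r"-?\d+(?:\.\d+)?", tail)
    return numbers[-1] if numbers else None

def accuracy(generations: list[str], gold: list[str]) -> float:
    """Fraction of problems whose extracted answer matches the gold answer."""
    hits = sum(
        extract_answer(gen) == ans.translate(BN_DIGITS)
        for gen, ans in zip(generations, gold)
    )
    return hits / len(gold)

# accuracy(["... উত্তর: ৮"], ["8"])  -> 1.0
```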

Drawing on these results, the article closes by highlighting future directions and the broader impact of its contributions for educational technology in low-resource languages.